# Chinese pre-training
| Model | Author | License | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|
| ChineseBERT Base | iioSnail | N/A | Large Language Model, Transformers, Chinese | 118 | 7 | A Chinese pre-trained model that integrates glyph and pinyin information, optimized for Chinese text processing. |
| Chinese LERT Small | hfl | Apache-2.0 | Large Language Model, Transformers, Chinese | 538 | 12 | LERT is a linguistics-theory-driven pre-trained language model designed to enhance performance through linguistic knowledge. |
| Chinese BERT WWM Finetuned JD | wangmiaobeng | Apache-2.0 | Large Language Model, Transformers | 24 | 0 | A fine-tuned version of hfl/chinese-bert-wwm on an unknown dataset, suitable for Chinese text-processing tasks. |
| Chinese RoBERTa L-6 H-512 | uer | N/A | Large Language Model, Chinese | 19 | 0 | A medium-sized model in the Chinese RoBERTa series pre-trained with UER-py on the CLUECorpusSmall corpus, suitable for a range of Chinese NLP tasks. |
| Mengzi BERT Base | Langboat | Apache-2.0 | Large Language Model, Transformers, Chinese | 438 | 37 | A BERT model pre-trained on a 300 GB Chinese corpus with MLM, POS, and SOP objectives. |
| Mengzi T5 Base | Langboat | Apache-2.0 | Large Language Model, Transformers, Chinese | 6,073 | 55 | A lightweight pre-trained model based on a 300 GB Chinese corpus. |
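Since every entry above is tagged Transformers, the BERT-style checkpoints can be exercised with the standard `fill-mask` pipeline. The sketch below is a minimal example, not taken from the listing itself; the repo id `hfl/chinese-lert-small` is an assumption inferred from the author and model name in the table.

```python
# Minimal sketch: masked-token prediction with one of the BERT-style
# checkpoints listed above. The repo id is an assumption inferred from
# the table (author "hfl", model "Chinese LERT Small").
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/chinese-lert-small")

# Ask the model to fill in the masked character of a short sentence.
for candidate in fill_mask("今天天气很[MASK]。"):
    print(candidate["token_str"], round(candidate["score"], 4))
```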
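Mengzi T5 Base, by contrast, is a text-to-text model rather than a masked-LM encoder, so it is driven through the seq2seq classes instead of `fill-mask`. A minimal sketch, assuming the repo id `Langboat/mengzi-t5-base`:

```python
# Minimal sketch, assuming the repo id "Langboat/mengzi-t5-base".
# T5-style models generate the answer as text conditioned on a prompt
# containing a sentinel token such as <extra_id_0>.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("Langboat/mengzi-t5-base")
model = T5ForConditionalGeneration.from_pretrained("Langboat/mengzi-t5-base")

# "The capital of China is located at <extra_id_0>."
inputs = tokenizer("中国的首都位于<extra_id_0>。", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```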